Fisher Information Matrix and its Application of Bipolar Activation Function Based Multilayer Perceptrons With General Gaussian Input

Authors

Abstract

For the widely used multilayer perceptrons (MLPs), there exist singularities in the parameter space, subspaces on which the Fisher information matrix (FIM) degenerates. These singularities seriously influence the learning dynamics of MLPs and have attracted many researchers' attention. As the FIM plays a key role in investigating singular MLPs, it is very important to obtain its analytical form. In this paper, for bipolar activation function based MLPs with general Gaussian input, by choosing the squared error as the loss function, analytical expressions of the FIM are obtained. The validity of the obtained results is then verified by two experiments.
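The paper's contribution is an analytical FIM; purely as a numerical point of comparison, the sketch below estimates the FIM of a one-hidden-layer MLP with bipolar (tanh) activation by Monte Carlo averaging of gradient outer products over a general (non-zero-mean, correlated) Gaussian input. This is a generic illustration, not the paper's derivation; the unit-variance Gaussian regression noise model and the helper names (`empirical_fim`, `grad_params`) are assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W, w):
    # One hidden layer, bipolar (tanh) activation, linear output.
    return w @ np.tanh(W @ x)

def grad_params(x, W, w):
    # Gradient of the scalar output with respect to all parameters,
    # flattened into a single vector (chain rule, written out by hand).
    h = np.tanh(W @ x)                  # hidden activations
    dW = np.outer(w * (1.0 - h**2), x)  # d output / d W
    dw = h                              # d output / d w
    return np.concatenate([dW.ravel(), dw])

def empirical_fim(W, w, mean, cov, n_samples=20_000):
    # Under a Gaussian regression model y = f(x) + eps, eps ~ N(0, 1),
    # the FIM equals E_x[ grad f(x) grad f(x)^T ]; estimate it by
    # Monte Carlo over the general Gaussian input distribution.
    xs = rng.multivariate_normal(mean, cov, size=n_samples)
    dim = grad_params(xs[0], W, w).size
    fim = np.zeros((dim, dim))
    for x in xs:
        g = grad_params(x, W, w)
        fim += np.outer(g, g)
    return fim / n_samples

# Tiny example: 2 inputs, 3 hidden units, correlated Gaussian input.
W = rng.normal(size=(3, 2))
w = rng.normal(size=3)
mean = np.array([0.5, -1.0])
cov = np.array([[1.0, 0.3], [0.3, 2.0]])
F = empirical_fim(W, w, mean, cov)
print("FIM shape:", F.shape, " smallest eigenvalue:", np.linalg.eigvalsh(F)[0])
```

For parameter settings near a singularity (for example, two hidden units sharing the same incoming weights, or an output weight set to zero), the smallest eigenvalue printed at the end collapses toward zero, which is the degeneracy the abstract refers to.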


Related articles

Key Diagonal Blocks of the Fisher Information Matrix on Neural Manifold of Full-Parametrised Multilayer Perceptrons

Abstract: It is well known that natural gradient learning (NGL) ([1]) may avoid local optima and plateau phenomena in the training process, since it takes into consideration the intrinsic geometric structure of the parameter space. However, the natural gradient ([1]) is itself induced by the Fisher information matrix (FIM) ([2]) defined on the 1-form tangent space ([3]); therefore, the calculation of the relevant F...
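The excerpt above concerns the diagonal blocks of the FIM used in natural gradient learning. As a minimal sketch of how such blocks would be used, assuming a block-diagonal FIM approximation with damping for invertibility (the function name and the SPD stand-in matrices are illustrative, not from the paper):

```python
import numpy as np

def natural_gradient_step(grad_blocks, fim_blocks, lr=0.1, damping=1e-4):
    # One natural-gradient update using a block-diagonal FIM approximation:
    # each parameter block is preconditioned by the inverse of its own
    # (damped) FIM diagonal block.
    deltas = []
    for g, F in zip(grad_blocks, fim_blocks):
        F_damped = F + damping * np.eye(F.shape[0])
        deltas.append(-lr * np.linalg.solve(F_damped, g))
    return deltas

# Hypothetical example: two parameter blocks (e.g. hidden and output weights).
rng = np.random.default_rng(1)
grads = [rng.normal(size=4), rng.normal(size=3)]
A = rng.normal(size=(4, 4)); B = rng.normal(size=(3, 3))
fims = [A @ A.T + np.eye(4), B @ B.T + np.eye(3)]  # SPD stand-ins for FIM blocks
steps = natural_gradient_step(grads, fims)
print([s.shape for s in steps])
```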


construction and validation of translation metacognitive strategy questionnaire and its application to translation quality

Like any other learning activity, translation is a problem-solving activity which involves executing parallel cognitive processes. The ability to think about these higher processes and to plan, organize, monitor, and evaluate the most influential executive cognitive processes is what Flavell (1975) called "metacognition", which encompasses raising awareness of mental processes as well as using effectiv...

Quantile regression with multilayer perceptrons

We consider nonlinear quantile regression involving multilayer perceptrons (MLPs). In this paper we investigate the asymptotic behavior of quantile regression in a general framework: first by allowing possibly non-identifiable regression models, such as MLPs with redundant hidden units, and then by relaxing the conditions on the density of the noise. We present a universal bound for the...
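The excerpt does not show the loss; quantile regression is conventionally fitted with the pinball (check) loss. Below is a minimal sketch assuming a one-hidden-layer tanh MLP trained by plain (sub)gradient descent on synthetic heteroscedastic data; the architecture, step size, and variable names are chosen here for illustration.

```python
import numpy as np

def pinball_loss(residual, tau):
    # Quantile ("pinball") loss: tau * r if r >= 0, else (tau - 1) * r.
    return np.where(residual >= 0, tau * residual, (tau - 1) * residual)

rng = np.random.default_rng(2)

# Toy data: heteroscedastic noise, so different quantiles genuinely differ.
x = rng.uniform(-2, 2, size=(500, 1))
y = np.sin(x[:, 0]) + rng.normal(scale=0.1 + 0.2 * np.abs(x[:, 0]))

# One-hidden-layer MLP trained by (sub)gradient descent on the pinball loss.
tau, lr, H = 0.9, 0.01, 16
W1 = rng.normal(scale=0.5, size=(H, 1)); b1 = np.zeros(H)
w2 = rng.normal(scale=0.5, size=H);      b2 = 0.0

for epoch in range(2000):
    h = np.tanh(x @ W1.T + b1)          # (N, H) hidden activations
    pred = h @ w2 + b2                  # (N,) conditional quantile estimate
    r = y - pred
    # d loss / d pred is -tau where r >= 0 and (1 - tau) where r < 0.
    dpred = np.where(r >= 0, -tau, 1 - tau) / len(y)
    dw2 = h.T @ dpred; db2 = dpred.sum()
    dh = np.outer(dpred, w2) * (1 - h**2)
    dW1 = dh.T @ x; db1 = dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1; w2 -= lr * dw2; b2 -= lr * db2

print("mean pinball loss:",
      pinball_loss(y - (np.tanh(x @ W1.T + b1) @ w2 + b2), tau).mean())
```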


Data classification with multilayer perceptrons using a generalized error function

The learning process of a multilayer perceptron requires the optimization of an error function E(y,t) comparing the predicted output, y, and the observed target, t. We review some usual error functions, analyze their mathematical properties for data classification purposes, and introduce a new one, E(Exp), inspired by the Z-EDM algorithm that we have recently proposed. An important property of ...
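The exact form of E(Exp) is not given in the excerpt above, so the sketch below only illustrates the general idea of swapping the error function in MLP training: a stand-in exponential-type error of the form tau * exp(e^2 / tau), which for large tau behaves like the squared error plus a constant (tau * exp(e^2 / tau) ≈ tau + e^2). The specific formula and parameter name are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def mse(y, t):
    # Usual mean squared error between prediction y and target t.
    return np.mean((y - t) ** 2)

def exp_error(y, t, tau=2.0):
    # Stand-in exponential-type error: tau * exp(e^2 / tau) per sample.
    # For large tau this behaves like MSE plus a constant, since
    # tau * exp(e^2 / tau) ~= tau + e^2.
    return np.mean(tau * np.exp(((y - t) ** 2) / tau))

y = np.array([0.9, 0.1, 0.8])
t = np.array([1.0, 0.0, 1.0])
print("MSE:", mse(y, t))
print("exp-type error (tau=2):", exp_error(y, t))
print("exp-type error (tau=100) minus tau:", exp_error(y, t, tau=100.0) - 100.0)
```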


Nonlinear Channel Equalization Using Multilayer Perceptrons with Information-theoretic Criterion

The minimum error entropy criterion was recently suggested in adaptive system training as an alternative to the mean-square-error criterion, and it was shown to produce better results in many tasks. In this paper, we apply a multilayer perceptron scheme trained with this information-theoretic criterion to the problem of nonlinear channel equalization. In our simulations, we use a realistic non...
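The excerpt names the minimum error entropy (MEE) criterion but not its estimator. A common nonparametric choice is Renyi's quadratic error entropy estimated with Parzen windows and Gaussian kernels, i.e. maximizing the "information potential" of the errors; the sketch below follows that convention, with the kernel width sigma and the toy error samples chosen arbitrarily.

```python
import numpy as np

def information_potential(errors, sigma=0.5):
    # Parzen-window estimate of the information potential
    #   V(e) = (1/N^2) * sum_i sum_j G_{sigma*sqrt(2)}(e_i - e_j),
    # where G is a Gaussian kernel. Minimizing Renyi's quadratic error
    # entropy H2 = -log V is equivalent to maximizing V.
    diffs = errors[:, None] - errors[None, :]
    s2 = 2.0 * sigma**2  # variance of the pairwise-difference kernel
    kernel = np.exp(-diffs**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return kernel.mean()

def quadratic_error_entropy(errors, sigma=0.5):
    return -np.log(information_potential(errors, sigma))

rng = np.random.default_rng(3)
concentrated = 0.05 * rng.normal(size=200)   # tightly clustered errors
spread = 1.0 * rng.normal(size=200)          # widely spread errors
print(quadratic_error_entropy(concentrated))  # lower entropy
print(quadratic_error_entropy(spread))        # higher entropy
```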



Journal

Journal title: IEEE Access

Year: 2022

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2022.3227427